System and method for estimating tuna caught by species on board fishing vessels
Abstract:
System and method for estimating tuna caught by species on board fishing vessels. The system comprises: image capture means (10) for capturing color images of tunas in transit through a passage zone (40); lighting means (20) for illuminating the passage zone (40) with diffuse light; and a control module (30) configured to: extract the silhouette of the individuals in the images; generate the hidden parts of partially hidden individuals; obtain geometric parameters of each individual by geometric pattern matching; obtain coloration parameters of each individual by extracting the percentage of color per band; estimate the tuna species to which each individual belongs, using the geometric and coloration parameters of each tuna as input to an artificial neural network that relates morphological, silhouette and color characteristics to tuna species; and estimate the tuna caught by species.

Publication number: ES2552397A1
Application number: ES201430791
Filing date: 2014-05-27
Publication date: 2015-11-27
Inventors: Joaquín GRACIA SALVADOR; Iñaki MINIÑO ARBILLA; Carlos SÁNCHEZ PLAZA; Alma ROMÁN LADRA; Antonio CUEVAS IGLESIAS
Applicants: Calvopesca SA; Tecnologia Marina Ximo SL
IPC main class:
Description:
System and method for estimating tuna caught by species on board fishing vessels.

Field of the invention

The present invention falls within the field of tuna fishing on tuna vessels, and more specifically within systems for handling and storing tuna on board such vessels.

Background of the invention

Tropical tuna fishing, and above all what is called industrial fishing, is based on the use of purse seiners. These vessels are designed to catch tuna and freeze it in situ. Tuna vessels are purse seiners that freeze tuna directly in their holds. These holds, called "vats", carry frozen fish and serve as a cooling system as well as for storage and transportation. The operational capacity of tuna vessels is measured by several parameters, including the capacity of their holds and the speed at which capture maneuvers are performed. When the fish is encircled and inside the sack of the net, it is extracted from the net. The fish is taken out of the water by means of a kind of shovel/funnel called a "salabardo". This large funnel, more than two meters in diameter, is handled by the ship's cranes, which are driven by the ship's hydraulic system. Fish of different sizes and species generally appear mixed in these salabardos. The salabardo is hoisted and its contents are deposited in a hopper that leads from the main deck to the fishing park. In the fishing park there are belts that transport the fish to the freezing and storage tanks. The tanks, which can have capacities of up to 120 tons of tuna, normally contain a mixture of salt water, called brine, at about -1 °C. The tuna enter the tanks and are kept there for about three or four days until completely frozen. Then, in some boats, the brine is extracted from the tank, leaving the fish dry, and the temperature is lowered so that the fish can be transported at between -18 and -22 °C.
The cold in the vats is transmitted through metal coils containing refrigerant fluid. As of today, the number of tunas caught, as well as their species or sizes, is only known after unloading in port. At capture time, only the experience of the fishing master, or certain estimates by the scientific observers on board, can provide some estimate of the catches (species, number and sizes) per set and per trip. Having a full hold does not always ensure a good return if the species or size of the specimens is not the one demanded by the market. Ships also lack an automatic and effective on-board tool to determine the presence of non-target species within the catch. There are several international organizations that ensure compliance with the legislation in this area, and fishing companies are the first interested in complying with its provisions and minimizing this type of catch, due not only to their responsibility in marine conservation but also to the numerous administrative sanctions that shipowners can face if they accidentally capture certain species of birds, turtles, sharks or dolphins. The present invention makes it possible, by means of artificial vision techniques, to estimate the catches made, detecting the number of specimens by species and by size, and, based on that information, to optimize current production processes, provide added value to the catch and increase the profit of the companies (and consequently of the entire value chain). Although artificial vision is not new in its application to fish processing, there are no known specific developments capable of analyzing recently caught fish, estimating their number and weight per species and reporting everything that is not tuna, in real time and aboard ships.
Thus, for example, patent documents US2008137104-A1, EP2559336-A1, CH701341-A2, WO2012008843-A1 and US4934537-A disclose the use of artificial vision techniques to automatically identify fish. In these documents, cameras and lighting means are used to obtain images, which are analyzed to obtain different parameters of the fish, such as biomass, dimensions (length, thickness), weight and defects or diseases. In most cases the fish are driven by conveyor belts, or move freely in the water, while identification and classification are done. However, in all of them said identification of fish parameters is carried out on individual fish, so that a separation process has taken place beforehand by which the individual has been separated from the rest of the fish in order to perform the identification. This is not applicable to a tuna vessel, where all the fish are taken to the vats together, without the possibility of separating them. In addition, none of them identifies the tuna species, only parameters that are easier to obtain, such as weight, volume and deformities. In fact, identifying the tuna species is an extremely difficult task, which is carried out manually by visual inspection by operators. Patent document CN102749361-A discloses an automatic method to identify the species of tuna, based on chopping the fish, heating it and analyzing the emitted gas; this is a complex, expensive and destructive method, since it requires chopped fish. There are no known methods for identifying tuna species by artificial vision techniques, given the enormous similarity between the species.
In addition, the inventions known from the above patent documents do not solve the existing needs on an industrial tuna fishing vessel, because they do not allow even a rough estimate of the amount and type of fish caught, in addition to not meeting the required throughput in tons per hour, not reaching sufficient percentages of effectiveness, and not being able to withstand the extreme environmental working conditions, with the consequent technical and operational problems. The present invention solves said problems.

Description of the invention

The present invention presents a solution that overcomes the limitations described above, being the first artificial vision system capable of reporting in real time the amount of tuna in each set, including the number of specimens per species, the kilograms of each species and the grouping by size within each species. In this way a classification can be made on board, at the origin of the fish, so as to inform customers and manage in advance the unloading and subsequent transfer to the areas or destinations where the fish is most valued. A first aspect of the present invention relates to a method for estimating tuna caught by species on board fishing vessels.
The method comprises:
- capturing a sequence of color images of the caught tunas in transit through a passage area illuminated by diffuse light;
- processing each captured image to:
• extract, through segmentation techniques, the silhouette of each individual present in the image;
• generate by interpolation the hidden parts of individuals that are partially hidden in the image;
• perform a geometric identification of each individual through a geometric pattern matching process, obtaining geometric parameters of each individual;
• extract the percentage of color per defined band in each individual, obtaining coloration parameters of each individual;
• feed an input vector with the geometric and coloration parameters obtained for each individual into an artificial neural network that establishes a relationship between the morphological, silhouette and color characteristics of each individual and the different tuna species, to estimate the species of tuna to which the individual belongs;
- once all the images have been processed, estimating the different tunas caught by species.

In a preferred embodiment, the method also includes the estimation of the size and weight of each individual detected in the image from the calculations of its silhouette and shape. Image segmentation preferably includes subtraction and thresholding processes. The process of generating the hidden parts of tunas is preferably carried out when said hidden parts represent a percentage of the area of the tuna below a certain overlap threshold. In a preferred embodiment, the artificial neural network is of the multilayer perceptron type and is trained with the RPROP supervised learning algorithm. The artificial neural network can establish a relationship between the morphological, silhouette and color characteristics of each individual not only with different tuna species, but also with other species, in order to also identify
bycatch (accidental catches). The method preferably comprises a stage of dynamic identification of individuals in the captured sequence of images, by which the silhouettes of those individuals that have already been analyzed are tracked and identified across the different images, thus avoiding duplication in the analysis of the same individual.

A second aspect of the invention relates to a system for estimating tuna caught by species on board fishing vessels. The system includes:
- image capture means configured to capture a sequence of color images of the caught tunas in their transit through a passage zone;
- lighting means configured to illuminate the passage area with diffuse light;
- a control module with data processing means configured to:
• process each captured image to:
- extract, through segmentation techniques, the silhouette of each individual present in the image;
- generate by interpolation the hidden parts of individuals that are partially hidden in the image;
- perform a geometric identification of each individual through a geometric pattern matching process, obtaining geometric parameters of each individual;
- extract the percentage of color per defined band in each individual, obtaining coloration parameters of each individual;
- feed an input vector with the geometric and coloration parameters obtained for each individual into an artificial neural network that establishes a relationship between the morphological, silhouette and color characteristics of each individual and the different tuna species, to estimate the species of tuna to which the individual belongs;
• once all the images have been processed, estimate the different tunas caught by species.

The data processing means of the control module can additionally be configured to estimate the size and weight of each individual detected in the image from the calculations of its silhouette and shape.
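The per-image processing flow described above (segmentation, occlusion interpolation, feature extraction, neural-network classification, aggregation) can be sketched in outline as follows. Every stage and function name here is an illustrative assumption, a simplified stand-in for the patent's processes, not its actual implementation:

```python
# Illustrative sketch of the per-image pipeline; every stage below is a
# simplified stand-in for the processes described in the patent.

def segment_silhouettes(image):
    """Subtraction + thresholding stand-in: return one 'silhouette' per fish."""
    return image["fish"]                      # hypothetical pre-detected fish

def interpolate_if_partially_hidden(fish, overlap_threshold=0.25):
    """Regenerate hidden parts when the hidden fraction is small enough."""
    if fish.get("hidden_fraction", 0.0) <= overlap_threshold:
        fish["complete"] = True
    return fish

def extract_features(fish):
    """Geometric pattern matching + color-per-band extraction stand-in."""
    return fish["geometry"] + fish["color"]   # concatenated input vector

def classify_species(features, network):
    """Neural-network stand-in: pick the species with the highest score."""
    scores = {species: fn(features) for species, fn in network.items()}
    return max(scores, key=scores.get)

def process_catch(images, network):
    """Run all images through the pipeline and tally tunas per species."""
    counts = {}
    for image in images:
        for fish in segment_silhouettes(image):
            fish = interpolate_if_partially_hidden(fish)
            if not fish.get("complete"):
                continue                      # unresolvable overlap: skip here
            species = classify_species(extract_features(fish), network)
            counts[species] = counts.get(species, 0) + 1
    return counts
```

With a toy "network" of two scoring functions, a fish hidden beyond the threshold is skipped while the others are counted per species.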
A further aspect of the present invention relates to a method for identifying the species of a tuna from a color image, which comprises:
- extracting, through background segmentation techniques, the tuna silhouette;
- performing an identification of the tuna by shape, by means of a geometric pattern matching process, obtaining geometric parameters of the tuna;
- extracting the percentage of color per defined band in the tuna, obtaining coloration parameters of the tuna;
- feeding an input vector with the geometric and coloration parameters of the tuna into an artificial neural network that establishes a relationship between the morphological, silhouette and color characteristics of the tuna and different tuna species, obtaining at the output of the neural network the specific tuna species.

This method is similar to the method for estimating tuna caught by species on board fishing vessels, with the difference that it can be applied to a single image of an individual tuna. Likewise, this method can include the estimation of the size and weight of each individual detected in the image from the calculations of its silhouette and shape. Likewise, the present invention provides reliable and real-time information on the accidental catches produced, identifying their species and the time of capture. Once the system is validated by the different international organizations, this information can complement and/or replace the control work carried out today by international inspectors on board the fishing vessels of the tuna fleets. The present invention eliminates human subjectivity through the use of artificial vision techniques and the application of autonomous processing and decision algorithms. It also allows a quick and standardized deployment to the rest of the world tuna fleet.
The object of the present invention is therefore a system and method for the automatic detection of different tuna species based on the capture of images in the fishing park of a commercial vessel, and the application on these images of algorithms for detecting profiles, silhouettes and colors, as well as algorithms for estimating the size, length and weight of said individuals. The algorithms allow a classification that reflects the reality of the captured species.

Brief description of the drawings

A series of drawings that help to better understand the invention and that expressly relate to an embodiment of said invention, presented as a non-limiting example thereof, is described very briefly below.

Figure 1 shows the artificial vision subsystem.
Figure 2 shows the controlled lighting subsystem.
Figure 3 shows the module for control, storage and data management.
Figures 4A, 4B and 4C show several views of the entire assembled system.
Figure 5 shows an image obtained from the subtraction and thresholding process.
Figure 6 shows an image of the tuna contour obtained by segmentation techniques.
Figure 7 shows the architecture of the neural network.
Figures 8A and 8B show the obtaining of reference points for the extraction of geometric parameters in two individuals of different tuna species.

Detailed description of the invention

The system mainly integrates the following subsystems: an artificial vision subsystem with image capture means 10 for the capture of tuna images, controlled lighting means 20, and a control module 30 with data processing means responsible for process control, storage and data management. Figure 1 schematically shows the artificial vision system. The inspection area is illuminated in this case by the controlled lighting means 20 shown in Figure 2.
The purpose of this element is to isolate the field of work of the artificial vision system from ambient lighting and to ensure the most appropriate conditions for image capture. The vision structure is designed to fit the dimensions of the tuna trays or conveyor belts in a fishing park. In a preferred embodiment, the lighting means 20 are composed of four 18-22 watt neon tubes, generating diffuse lighting with a color temperature between 5000 and 6500 K, fed by high-frequency electronic ballasts. These lighting points concentrate electromagnetic radiation in the space through which the tunas to be sampled pass. The lighting used is diffuse: a softer lighting that does not generate pronounced shadows, without the intensity or the glare of direct light. It is spread over the visual field of capture, so that it envelops the tunas. Figure 3 shows the control module 30, also responsible for data storage and management. This control module 30 comprises data processing means that receive the images captured by the cameras and perform the processing thereof, which will be detailed later. Figures 4A, 4B and 4C show several views (front, side and plan, respectively) of the system assembled over a tuna passage area 40 (e.g. a tray, a ramp or a conveyor belt). For simplicity, the control module 30, which receives for analysis the images captured by the image capture means 10, is not shown in these figures. The system is composed of a support structure 50, a scanning arch, which integrates the image capture means 10 and the illumination means 20. The support structure 50 is installed over one of the tuna passage areas 40 in the fishing park, inside a fishing vessel, either on the transport belt or on the discharge slides towards the freezing tanks. The installation can in principle be fixed or mobile, so that the modules necessary for inspection can be assembled and disassembled depending on whether or not they are needed.
A set of lighting points 20 of the same or different geometry, with the same or different spatial and spectral emission patterns, arranged around the inspection zone, illuminate the scanning area through which the tunas pass, while a set of cameras 10, located facing the inspection area, on the plane of the area to be scanned, with the same or different fields of vision, capture color contrast images by reflectance, as well as line profile maps, of the set of tunas from various points. The arrangement of the lighting points is such that it allows a homogeneous illumination of the area. The images acquired by the cameras feed the data processing means of the control module 30 which, through the application of specific algorithms for the detection of physiological and morphological characteristics of tunas, classify the analyzed set. The algorithms allow the detection of tuna species and the estimation of weights and sizes. In the detection, the algorithms process the image and extract a series of correlation parameters, pixel counts and derived values. These data are stored in a database incorporated in the system. The data processing means can process different tunas within the same scan capture. Additionally, the control module 30 allows the processing means to be managed, comprising, on the one hand, user interfaces that show the captured images and the detection results obtained, and, on the other, a database that stores the results obtained by the system. The automatic inspection system works as follows:
- Tuna caught with the fishing gear of the boat slide down the distribution channel towards the freezer vats in the fishing park of the boat. A sensor detects the passage of tunas, informing the control module 30 so that the data capture process is initiated.
- The computer modules activate the registration and storage of images, making progressive captures in an optical window (in a preferred embodiment, an approximate window of 1000 x 1000 millimeters).
- The sequence of 2D images is stored indexed, for example in SSD physical memory units integrated in the equipment. The stored images include information on various parameters of brightness, exposure time and others corresponding to each photograph. The device switches to standby when the sensor detects that tunas no longer pass through the scanning area. The set of images and data obtained is stored in a self-generated folder with date, time, vessel name, trip, fishing area, set number, etc.
- The computerized image processing application of the control module 30 is started manually or automatically (as defined by the user) after the image capture is finished. After the complete processing of the images by means of the algorithms and artificial neural network developed specifically for this invention, the following are obtained for the capture set:
• Estimation of the total number of tunas caught.
• Estimation of units by predefined size/weight groups.
• Percentage of tunas by species, by group and in total.

The image processing algorithm is detailed below. Each captured image, with its associated data, is processed by a set of algorithms expressly developed and adjusted for this application. Specifically, the following processes are carried out:
a. Segmentation, through several algorithmic processes of subtraction and thresholding, of the tunas and the background of each capture. Figure 5 shows, by way of example, a binary image obtained as a result of the subtraction and thresholding process, where several fish 1 can be seen.
b. Marking for interpolation in semi-hidden tunas.
c. Calculation of the percentage of color in the whole frame, by defined band range. As a result, the following information is obtained per frame:
■ Total number of tunas.
■ Total volume and weight.
■ Color percentages per band.

Next, each tuna captured in the image is processed for the extraction of its silhouette and percentage of color. Figure 6 shows, in a binary image, the outline or silhouette of a tuna obtained by segmentation techniques. The following processes are subsequently applied to each extracted and segmented tuna:
a. Interpolation and generation of hidden parts in tunas whose hidden parts represent a percentage equal to or less than a certain overlap threshold (in a preferred embodiment, 25% of the tuna, although this percentage may vary). The system integrates an algorithm that, after detecting a contour and segmenting a complete tuna or part of it, first tracks the contour of the tuna in the obtained segmentation. If the perimeter contour tracking is complete, it is counted as a complete specimen. If one or more parts needed to close the perimeter of the tuna contour are missing, first an estimate of the number of hidden sections is made, then the total percentage of hidden silhouette is calculated, and in a third step a comparison with a base of models is made to determine whether, with the percentage of tuna visualized and segmented, the missing parts can be interpolated and a complete specimen thus obtained, which is passed directly to the subsequent species and size detection processes. If in this process a tuna specimen cannot be recreated through interpolation, these "pieces" or sections of tuna are merged into one in order to compute a volume, which is added to the volumes of the other cases in which a specimen could not be recreated. These volumes without an "owner" are computed as a general volume and can be associated by programming with the most numerous species in the current inspection process (or, for example, with the one with the lowest commercial value).
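A minimal sketch of the occlusion-handling decision described above. The 25% threshold comes from the patent's preferred embodiment, while the data layout and function names are illustrative assumptions:

```python
# Illustrative sketch of the occlusion-handling rule: interpolate specimens
# hidden below the overlap threshold, pool the rest as "ownerless" volume.

def triage_segments(segments, overlap_threshold=0.25):
    """Split detections into complete specimens and an unassigned volume pool."""
    complete, ownerless_volume = [], 0.0
    for seg in segments:
        if seg["hidden_fraction"] == 0.0:
            complete.append(seg)               # closed perimeter: complete specimen
        elif seg["hidden_fraction"] <= overlap_threshold:
            seg = dict(seg, interpolated=True) # hidden parts regenerated from models
            complete.append(seg)
        else:
            ownerless_volume += seg["volume"]  # cannot be reconstructed
    return complete, ownerless_volume

def assign_ownerless(ownerless_volume, species_counts):
    """Attribute the pooled volume to the most numerous species detected."""
    if not species_counts:
        return None
    return max(species_counts, key=species_counts.get), ownerless_volume
```

A fish hidden 20% is regenerated and counted, while one hidden 50% only contributes its volume to the pool, which is then attributed to the dominant species of the set.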
According to the experience acquired in tuna inspection, a hidden percentage greater than 25% in small fish prevents regeneration, although in larger fish the percentage of concealment may be higher. Other aspects are also involved, such as the hidden area, the number of hidden parts, etc.
b. Geometric pattern matching process ("Geometric Model Finder", GMF).
c. Processing of the data obtained from the deformation pattern to obtain shape and contour indicators.
d. Processing of the percentage of pigmentation and color per band, obtaining a colorimetric profile per specimen.

The present invention performs a dynamic identification of the individuals in the scene, which is necessary in order to identify, for each capture, which individuals have already been processed and thus avoid duplication in the analysis. The system performs a sequential matrix image capture and the identification and marking of the different individuals that appear in the scene and that, moving automatically on the conveyor belts, trays or ramps, continue to appear in successive captures. This identification is based on the appearance and the (usually linear) displacement of the individuals in the scene, and their subsequent disappearance. An iterative algorithm is used for correct identification. For each image captured from the scene the following steps apply:
i. Capture and record the image.
ii. Elimination of the image background: the background of the image (if visible) is removed, in order to leave only the individuals present in the scene.
iii. Comparison with the list of silhouettes of individuals registered in the previous capture: if there is a collection of individuals already identified in the scene from the previous image, a correlation is applied between the current image and the list of silhouettes of previously identified individuals.
iv.
Elimination from the list of silhouettes of those silhouettes corresponding to individuals partially or totally outside the inspection scene: the list of silhouettes of individuals in the scene is maintained throughout the dynamic identification process in order to identify those areas of the image corresponding to individuals already analyzed. As soon as an individual has left the scene, totally or partially, it is assumed that this is due to the displacement of the conveyor belt, it being understood that this individual is leaving the scene or has already left it completely. This means that the individual has previously been analyzed, and once it leaves the scene it is no longer necessary to keep its silhouette information in the list of identified individuals.
v. Masking of areas corresponding to individuals already identified: a mask is constructed from the template of silhouettes of individuals, with the translation values obtained by the correlation executed in the previous step.
vi. Identification of complete individuals in the image: in the unmasked part of the image, new individuals that may have appeared are searched for, discarding those individuals whose bodies are not entirely contained in the image.
vii. Classification of complete individuals: once the bodies of complete individuals have been identified, their classification is carried out using the silhouette and color models, assigning them to the corresponding species (if an individual is not assigned to any class, it is marked as not identified, i.e. accidental capture).
viii. Updating of the list of silhouettes of identified individuals: with the new collection of identified silhouettes, the list of identified individuals is updated, which will be taken into account in the analysis of the next captured image.
ix. Capture of a new image and start of the iterative cycle: a new image is captured, and the processing cycle is repeated.
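The iterative tracking cycle (steps i-ix) can be sketched with a simplified one-dimensional model of belt motion; the silhouette representation, scene length and helper names below are illustrative assumptions, not the patent's implementation:

```python
# Illustrative sketch of the dynamic identification cycle: track already
# analyzed silhouettes across frames so each fish is counted only once.

SCENE_LENGTH = 100          # assumed field of view along the belt, in pixels

def update_tracker(tracked, detections, belt_shift, scene_length=SCENE_LENGTH):
    """One iteration of the cycle.

    tracked    : positions (leading edge) of silhouettes already analyzed
    detections : [(position, length)] of silhouettes found in the new frame
    belt_shift : displacement of the belt between the two frames
    Returns (updated_tracked, newly_analyzed_count).
    """
    # Steps iii-iv: advance known silhouettes and drop those leaving the scene.
    moved = [p + belt_shift for p in tracked]
    kept = [p for p in moved if p < scene_length]

    new_count = 0
    for pos, length in detections:
        if pos + length > scene_length or pos < 0:
            continue                     # step vi: body not fully in the image
        # Step v (masking), simplified: skip detections matching a known track.
        if any(abs(pos - p) < length / 2 for p in kept):
            continue
        kept.append(pos)                 # steps vii-viii: analyze and register
        new_count += 1
    return kept, new_count
```

Run over three frames, a fish straddling the scene edge is ignored until fully visible, a tracked fish is never recounted, and fish carried off by the belt are dropped from the list.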
In this iterative algorithm, only those individuals that appear completely included in the scene are analyzed. An individual may be partially out of the picture; in that case, it is because the individual is either entering the field of vision or leaving it, due to the displacement of the belt. If it is entering the field of vision, the individual will not be considered analyzable until its bodily limits are within the scene. If the individual has already been analyzed, it will have been incorporated as an entry in the dynamic list of silhouettes of analyzed individuals; in that case, if the silhouette of the analyzed individual moves partially (or totally) out of the picture, said entry is removed from the list. A critical step for a correct dynamic identification of individuals is the segmentation (elimination) of the background of the image, that is, of everything that does not matter when it comes to detecting silhouettes of individuals. After these processes, more than 80 characteristic parameters of contour, volume, shape, pigmentation, etc. are obtained per tuna. After obtaining the mentioned characteristic parameters, these are automatically introduced into a neural network (Figure 7), which makes it possible to combine mathematical functions that simulate the way in which brain neurons connect to each other. In this way, an artificial neural network can be trained through iterative algorithms to model complex problems, establishing a relationship between a space of characteristics, which in this case includes all the morphological, silhouette and color characteristics of each individual, and the different tuna species. The procedure for extracting these characteristics defines an input vector to the neural network with more than 80 parameters, which must achieve high efficiency after the training and validation of the network.
Specifically, a set of characteristics is used: more than 20 geometric parameters and 80 coloration parameters. The geometric parameters are extracted, analyzed and processed directly:
- Distances head/tail, head/dorsal fin, mouth/eye, among others.
- Width in 5 sections of the individual, divided from head to tail.
- Angle defined by the tangent of the fish with respect to the horizontal.
- Angle defined by the tangents of the silhouette at the nose and at the jaw of the fish.

In addition, another algorithmic point-extraction base is used, consisting of the radial projection, from the center of mass C of the projection of the individual, of various lines 2 on which the different reference points 3 are obtained, as shown in Figure 8A for a reference species used in development. As can be seen, the sampling carried out yields a series of reference points 3 that are not equidistant on the surface of the tuna. The fact that the radial sampling used yields a non-uniform distribution of points makes it especially effective and very interesting, since, as the distribution of the points is affected by the shape of the tuna, the differences between the reference points 3 of the different tuna species are maximized. Figure 8B shows the sampling obtained for a different species, where it can be observed that the distribution of reference points 3 changes considerably due to the morphological differences. Regarding the coloration parameters, color percentages in each programmed spectral band and the luminous intensity exceeded per band are extracted and analyzed, in all cases also determining the area of the specimen where they are obtained. Contrasts are drawn at the perimeter of the specimen and in the areas of the fins and the eye. The information obtained is combined to obtain a set of colorimetric parameters or indicators necessary to complete the species determination.
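The radial sampling just described can be illustrated with a small sketch on a binary silhouette mask; the grid, number of rays, marching step and helper names are illustrative assumptions:

```python
# Illustrative sketch of radial reference-point extraction: cast rays from
# the silhouette's center of mass and record where each ray leaves the mask.
import math

def center_of_mass(mask):
    """Centroid (row, col) of the foreground pixels of a binary mask."""
    pts = [(r, c) for r, row in enumerate(mask)
                  for c, v in enumerate(row) if v]
    n = len(pts)
    return (sum(r for r, _ in pts) / n, sum(c for _, c in pts) / n)

def radial_reference_points(mask, n_rays=8, step=0.25):
    """March outward along n_rays directions; keep the last in-mask point."""
    rows, cols = len(mask), len(mask[0])
    cr, cc = center_of_mass(mask)
    points = []
    for k in range(n_rays):
        ang = 2 * math.pi * k / n_rays
        r, c, last = cr, cc, (cr, cc)
        while 0 <= round(r) < rows and 0 <= round(c) < cols \
                and mask[round(r)][round(c)]:
            last = (r, c)
            r += step * math.sin(ang)
            c += step * math.cos(ang)
        points.append(last)            # boundary point along this ray
    return points
```

On an elongated mask, the boundary points end up at different distances from the centroid, which is precisely the non-uniform, shape-dependent distribution the patent exploits to separate species.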
The modeling procedure follows an aggregative scheme. For this equipment, the base neuron function of the network has been identified. Next, the structure has been defined (always based on the multilayer perceptron architecture), conditioned by the number of input neurons (the characteristics) and of output neurons (the tuna species). In order to offer an estimate of the prediction level of the network, the numerical range of the network output (between 0 and 1, or 0% and 100%) is segmented into three ranges, applying the previously established confidence index labels: LOW: 0-60%; MEDIUM: 60-80%; and HIGH: 80-100%. The training method used on the resulting network is Resilient Backpropagation (RPROP). This method is a variation within the family of backpropagation algorithms, based on the delta rule and first-order optimization. The convergence and robustness of this algorithm are superior to those of other algorithms of the same family. RPROP uses independent parameters that control the speed with which the objective function is traversed for each of the neural network weights. RPROP is also not affected by the saturation of neurons in the neural network, since only the sign of the derivative is used to determine the direction of the weight update. Consequently, it converges faster than algorithms based only on backpropagation. The results obtained are stored in the equipment and can be exported, represented on screen or printed through the generation of customized reports. The system object of the present invention therefore bases its operation on the detection of the morphology (silhouette and pigmentation) of tunas. As the whole tuna family has similar contours and shapes, an artificial neural network is applied for the identification of the species.
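A compact sketch of the two numerical pieces just described: the three-range confidence labeling of the network output and the RPROP per-weight, sign-based step-size rule. The update constants (1.2 and 0.5) are the values commonly used for RPROP, not taken from the patent, and all names are illustrative:

```python
# Illustrative sketch: confidence bucketing of the network output and a
# single RPROP weight update (sign-based, per-weight adaptive step size).

def confidence_label(output):
    """Map a network output in [0, 1] to the patent's three confidence ranges."""
    pct = output * 100.0
    if pct < 60.0:
        return "LOW"
    return "MEDIUM" if pct < 80.0 else "HIGH"

def rprop_step(weight, grad, prev_grad, step,
               eta_plus=1.2, eta_minus=0.5, step_max=50.0, step_min=1e-6):
    """One RPROP update: grow the step while the gradient keeps its sign,
    shrink it on a sign change; only the sign of grad sets the direction."""
    if grad * prev_grad > 0:
        step = min(step * eta_plus, step_max)    # same direction: accelerate
    elif grad * prev_grad < 0:
        step = max(step * eta_minus, step_min)   # overshoot: back off
        grad = 0.0                               # skip update after sign change
    if grad > 0:
        weight -= step
    elif grad < 0:
        weight += step
    return weight, step, grad
```

Because only the gradient's sign is used, a saturated neuron with a tiny but correctly signed derivative still moves its weight by the full adaptive step, which is the robustness property the text attributes to RPROP.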
Neural networks are a combination of mathematical functions that simulate the way brain neurons connect to each other. An artificial neural network can thus be trained, using iterative algorithms, to model problems of complex solution, establishing a relationship between a characteristic space (in this case, all the information obtained from the silhouette and pigmentation inspection) and the different classes (the different tuna species and the remaining species). The present invention applies exclusively developed algorithms, based on silhouette correlation, on the estimation of volumes in multidimensional color-representation spaces from color images (and able to process fluorescence images as well), and on segmentation and filtering of the captured images. Identifying the species requires a previous step consisting of separating the individuals present in the scene from the background. This leads to dynamic segmentation and identification. During this preprocessing phase the images are subjected to different subtraction and thresholding processes that involve various mathematical operations. For identification by shape, an algorithm based on the correspondence of geometric patterns is used, as opposed to the conventional technique of pattern search by normalized grayscale correlation, commonly called "NGC". The system object of the invention is based on the correspondence of geometric patterns (Geometric Model Finder, GMF). The algorithm developed recognizes the geometry of each individual using a series of valid curves that do not correspond to a grid of pixels, and then looks for similar shapes in the image without relying on specific gray levels. The result is an improved ability to locate individuals with high precision despite the changes in lighting, scale, rotation and other conditioning parameters found on a commercial tuna fishing vessel.
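The subtraction-and-thresholding preprocessing described above can be sketched as a minimal background-subtraction step. This is an illustrative sketch, not the patented preprocessing chain; the function name and the threshold value are assumptions:

```python
import numpy as np

def segment_silhouettes(frame, background, threshold=30):
    """Subtract a reference background image from the current frame and
    threshold the absolute difference to obtain a binary silhouette mask
    separating the individuals from the background scene."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    if diff.ndim == 3:                     # color image: keep the strongest band
        diff = diff.max(axis=2)
    return diff > threshold
```

In practice the resulting mask would still be cleaned (noise removal, hole filling) before silhouette extraction, as the description of the GMF model preparation suggests.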
For the development and creation of the GMF model, the images were pre-treated to eliminate noise and were also thresholded in order to obtain a binary image with the silhouette of each species. Another of the processes carried out by the system is classification by means of a red, green and blue (RGB) color detection model applied to the pigmentation of the tunas. The model processes and delivers the intensity levels, in number of pixels, of each of the three bands mentioned. After obtaining the data from the previous algorithms, the neural network used incorporates all the morphological, silhouette and pigmentation characteristics of each individual. Preferably, the system is controlled by a control module comprising the processing means, so that the operator manages the system through a series of interfaces shown on a screen, preferably a touchscreen, which controls the operations of said system. The operator can select information for each set inspected, such as its identifier, the type of species detected and the type of inspection to be performed. The control module can be configured so that, at the end of an inspection session, the system issues a results report in which the results of the inspection are displayed graphically and statistically. In addition, the control module has an internal database in which the results of the inspection of each set are stored for later consultation and filtering according to various search criteria: by dates, by species, by sizes, etc. In parallel with the inspection for species, the system can also estimate the length, size and biomass of each individual, offering additional statistics on the total weight per cast, average weights and standard deviations. The results of the weight estimation can also be included in the internal database of the system and can be exported at the operator's request.
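The RGB pigmentation model described above — per-band intensity levels over the pixels of one individual — can be sketched as a per-band percentage computation. This is an illustrative sketch of such a color descriptor, not the exact patented colorimetric indicators; the function name is an assumption:

```python
import numpy as np

def band_percentages(image, mask):
    """Percentage of total intensity contributed by each of the R, G and B
    bands over the pixels of one individual's silhouette mask."""
    pixels = image[mask].astype(np.float64)   # shape (n_pixels, 3): one row per pixel
    band_sums = pixels.sum(axis=0)            # total intensity per band
    total = band_sums.sum()
    return band_sums * 100.0 / total if total else np.zeros(3)
```

A vector like this, computed per region of the specimen (perimeter, fins, eye), would be part of the coloration input fed to the neural network alongside the geometric parameters.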
The size is extracted from the silhouette and shape calculations (frontal angle, tail angle, mandibular angle, eye-gill ratio, width of the head, belly, final section and tail, total length, head length, tail length, etc.). The weight is obtained, once the volume is known, by applying a calculation based on the specific weight per cm3, which varies according to the species. Having clearly described the invention, it is stated that the particular embodiments described above may be modified in detail, provided that such modifications do not alter the fundamental principle and the essence of the invention.
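The volume-to-weight step above reduces to multiplying the estimated volume by a species-dependent specific weight. A minimal sketch, in which the function name and the density figure in the example are purely illustrative assumptions:

```python
def estimate_weight_kg(volume_cm3, specific_weight_g_cm3):
    """Weight from the estimated volume and a species-specific density
    (specific weight per cm3), converted from grams to kilograms."""
    return volume_cm3 * specific_weight_g_cm3 / 1000.0
```

For instance, a hypothetical individual with an estimated volume of 40 000 cm3 and an assumed specific weight of 1.05 g/cm3 would be estimated at 42 kg.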
Claims:
Claims (10) [1] 1. Method for estimating tuna caught by species on board fishing vessels, characterized in that it comprises: - capturing a sequence of color images of the caught tunas in transit through a passage area (40) illuminated by diffused light; - processing each captured image to: • extract, through segmentation techniques, the silhouette of each individual present in the image; • generate by interpolation the hidden parts of individuals that are partially hidden in the image; • perform a geometric identification of each individual through a process of correspondence of geometric patterns, obtaining geometric parameters of each individual; • extract the percentage of color per defined band in each individual, obtaining coloration parameters for each individual; • introduce an input vector with the geometric and coloration parameters obtained for each individual into an artificial neural network that establishes a relationship between the morphological, silhouette and color characteristics of each individual and the different species of tunas, to estimate the species of tuna to which the individual belongs; - once all the images have been processed, estimating the different tunas caught by species. [2] 2. Method according to claim 1, characterized in that it comprises estimating the size and weight of each individual detected in the image from the calculations of its silhouette and shape. [3] 3. Method according to any of the preceding claims, characterized in that the segmentation of the image comprises subtraction and thresholding processes. [4] 4. Method according to any of the preceding claims, characterized in that the process of generating the hidden parts of tunas is carried out when said hidden parts represent a percentage of the area of the tuna below a certain overlap threshold. [5] 5. 
Method according to any of the preceding claims, characterized in that the artificial neural network is of the multilayer perceptron type. [6] 6. Method according to any of the preceding claims, characterized in that the artificial neural network establishes a relationship between the morphological, silhouette and color characteristics of each individual and the different species of tunas, as well as other different species, in order to also obtain the bycatch. [7] 7. Method according to any of the preceding claims, characterized in that the artificial neural network is trained by the RPROP supervised learning algorithm. [8] 8. Method according to any of the preceding claims, characterized in that it comprises a stage of dynamic identification of individuals in the captured sequence of images, by which the silhouettes of individuals that have already been analyzed are tracked and identified in the different images, thus avoiding duplicate analyses of the same individual. [9] 9. 
System for the estimation of tunas caught by species on board fishing vessels, characterized in that it comprises: - image capture means (10) configured to capture a sequence of color images of the caught tunas in their transit through a passage zone (40); - lighting means (20) configured to illuminate the passage area (40) with diffused light; - a control module (30) with data processing means configured to: • process each captured image to: - extract, through segmentation techniques, the silhouette of each individual present in the image; - generate by interpolation the hidden parts of individuals that are partially hidden in the image; - perform a geometric identification of each individual through a process of correspondence of geometric patterns, obtaining geometric parameters of each individual; - extract the percentage of color per defined band in each individual, obtaining coloration parameters of each individual; - introduce an input vector with the geometric and coloration parameters obtained for each individual into an artificial neural network that establishes a relationship between the morphological, silhouette and color characteristics of each individual and the different species of tunas, to estimate the species of tuna to which the individual belongs; • once all the images have been processed, estimate the different tunas caught by species. [10] 10. System according to claim 9, characterized in that the data processing means of the control module (30) are additionally configured to estimate the size and weight of each individual detected in the image from the calculations of its silhouette and shape.
Family patents:
Publication number | Publication date
ES2552397B1 | 2016-09-14
Cited references:
Publication number | Filing date | Publication date | Applicant | Patent title
EP0331390A2 | 1988-02-29 | 1989-09-06 | Grove Telecommunications Ltd. | Fish sorting machine
WO1994009920A1 | 1992-10-23 | 1994-05-11 | The Minister Of Agriculture Fisheries And Food In Her Britannic Majesty's Government Of The United Kingdom Of Great Britain And Northern Ireland | Fish sorting machine
JPH07231733A | 1993-09-30 | 1995-09-05 | Hirobumi Matsuo | Apparatus for discriminating fish species
US20050025357A1 | 2003-06-13 | 2005-02-03 | Landwehr Val R. | Method and system for detecting and classifying objects in images, such as insects and other arthropods
ES1056844U | 2004-02-25 | 2004-05-16 | Tacore, S.L. | Arrangement for the classification and identification of fish species through artificial vision systems
CN110334751A | 2019-06-24 | 2019-10-15 | 北京理工华汇智能科技有限公司 | Image processing method, device and terminal for node tying
Legal status:
2016-09-14 | FG2A | Definitive protection | Ref document number: 2552397; Country of ref document: ES; Kind code of ref document: B1; Effective date: 2016-09-14
Priority:
Application number | Filing date | Patent title
ES201430791A (granted as ES2552397B1) | 2014-05-27 | System and method for estimating tuna caught by species on board fishing vessels